CHING-YI TSAI

Gait Gestures
Examining Stride and Foot Strike Variation as an Input Method While Walking
UIST'24 PAPER | 13 OCT 2024


Three of our designed walking gestures for discrete input.
AR users use slight walking variations to invoke app commands, including app selection, food ordering, and a volume slider.

Intro

In a world increasingly dominated by screens, our physical movements often feel at odds with the digital realm. We pause our strides to type messages, halt our journeys to adjust music, and interrupt our explorations to answer calls. Yet what if the very act of walking could itself become a seamless and intuitive form of interaction? This research explores Gait Gestures, a new approach that transforms subtle variations in our strides and foot strikes into an input language understood by augmented reality (AR) interfaces. In this project, we explore its feasibility by investigating different gait gestures in terms of user experience and recognizability, followed by a series of "walk"-throughs with common AR apps. This approach reimagines the way we interact with technology while on the move: instead of forcing a break in our natural flow, Gait Gestures let us perform digital interactions in harmony with our walking experience.
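To make the idea of "foot strikes as input" concrete, here is a minimal, hypothetical sketch of how a single gait gesture (a sharp foot tap) might be spotted in a stream of vertical-acceleration samples using simple peak thresholding. The names (`detect_taps`, `TAP_THRESHOLD`) and the threshold values are illustrative assumptions, not the recognition method used in the paper.

```python
# Hypothetical sketch: spotting "foot tap" gestures in vertical
# acceleration data via peak thresholding. All constants are assumed
# for illustration, not taken from the Gait Gestures paper.

TAP_THRESHOLD = 2.5   # acceleration (in g) above which we call it a tap impact
MIN_GAP = 10          # minimum samples between two distinct taps

def detect_taps(accel_z, threshold=TAP_THRESHOLD, min_gap=MIN_GAP):
    """Return the sample indices where a tap-like spike occurs."""
    taps = []
    last = -min_gap  # allows a tap at index 0
    for i, a in enumerate(accel_z):
        # A sample counts as a new tap if it exceeds the threshold and
        # is far enough from the previous detected tap.
        if a > threshold and i - last >= min_gap:
            taps.append(i)
            last = i
    return taps

# Synthetic signal: a quiet walking baseline with two sharp impact spikes.
signal = [1.0] * 50
signal[20] = 3.0   # first tap
signal[40] = 3.2   # second tap

print(detect_taps(signal))  # → [20, 40]
```

A real recognizer would of course need to separate deliberate taps from ordinary heel strikes (e.g., by amplitude, timing relative to the stride cycle, or a learned classifier), but the sketch shows the basic shape of turning raw gait data into discrete input events.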

Project Inspiration: Creativity and Magic in Steps

I like to take a walk—it’s my go-to way to decompress after a long day and an essential part of my creative thinking process (related reading: Give Your Ideas Some Legs: The Positive Effect of Walking on Creative Thinking). Beyond that, this project is inspired by fantastical tales where a character’s steps conjure magical spells or reveal hidden paths. Just as a princess’s twirl might summon a shimmering gown or a hero’s leap could ignite a blazing sword, what if a well-placed foot tap could activate a music playlist, or an elongated stride could adjust the volume?